Layer (type) Output Shape Param #
==============================================================
random_rotation_2 (RandomRot (None, 100, 100, 3) 0
_________________________________________________________________
conv2d_10 (Conv2D) (None, 98, 98, 32) 896
_________________________________________________________________
max_pooling2d_8 (MaxPooling2 (None, 49, 49, 32) 0
_________________________________________________________________
conv2d_11 (Conv2D) (None, 47, 47, 50) 14450
_________________________________________________________________
max_pooling2d_9 (MaxPooling2 (None, 23, 23, 50) 0
_________________________________________________________________
conv2d_12 (Conv2D) (None, 21, 21, 50) 22550
_________________________________________________________________
max_pooling2d_10 (MaxPooling (None, 10, 10, 50) 0
_________________________________________________________________
conv2d_13 (Conv2D) (None, 8, 8, 50) 22550
_________________________________________________________________
max_pooling2d_11 (MaxPooling (None, 4, 4, 50) 0
_________________________________________________________________
conv2d_14 (Conv2D) (None, 2, 2, 50) 22550
_________________________________________________________________
flatten_2 (Flatten) (None, 200) 0
_________________________________________________________________
dense_6 (Dense) (None, 100) 20100
_________________________________________________________________
dropout_4 (Dropout) (None, 100) 0
_________________________________________________________________
dense_7 (Dense) (None, 50) 5050
_________________________________________________________________
dropout_5 (Dropout) (None, 50) 0
_________________________________________________________________
dense_8 (Dense) (None, 12) 612
==============================================================
Total params: 108,758
Trainable params: 108,758
Non-trainable params: 0
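The parameter counts in the summary can be checked by hand: a Conv2D layer has k·k·c_in weights per filter plus one bias per filter, and a Dense layer has n_in·n_out weights plus n_out biases. A minimal check in plain Python (the 3×3 kernel size is inferred from the 100→98 spatial transitions, not stated in the summary):

```python
def conv_params(k, c_in, c_out):
    # k*k*c_in weights per filter, plus one bias per filter
    return k * k * c_in * c_out + c_out

def dense_params(n_in, n_out):
    # full weight matrix plus one bias per output unit
    return n_in * n_out + n_out

total = (conv_params(3, 3, 32)        # conv2d_10:    896
         + conv_params(3, 32, 50)     # conv2d_11: 14,450
         + 3 * conv_params(3, 50, 50) # conv2d_12..14: 22,550 each
         + dense_params(200, 100)     # dense_6:   20,100
         + dense_params(100, 50)      # dense_7:    5,050
         + dense_params(50, 12))      # dense_8:      612
print(total)  # 108758, matching "Total params" above
```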
Network organization
2D convolution window size
Output size (activation map)
Source: stanford.edu/~shervine/teaching/cs-230/cheatsheet-convolutional-neural-networks
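The cheatsheet's output-size formula for a "valid" convolution is O = (I − K)/S + 1; combined with 2×2 max pooling (stride 2, floor division on odd sizes), it reproduces every spatial size in the summary above. A quick walk-through in plain Python (the 3×3 kernel and stride 1 are inferred from the 100→98 transitions, not stated in the source):

```python
def conv_out(i, k=3, s=1):
    # "valid" convolution: O = (I - K) / S + 1
    return (i - k) // s + 1

def pool_out(i, k=2, s=2):
    # 2x2 max pooling, stride 2; odd sizes round down
    return (i - k) // s + 1

size = 100
sizes = []
for _ in range(4):          # four Conv2D + MaxPooling2D pairs
    size = conv_out(size)
    sizes.append(size)
    size = pool_out(size)
    sizes.append(size)
sizes.append(conv_out(size))  # final Conv2D without pooling
print(sizes)  # [98, 49, 47, 23, 21, 10, 8, 4, 2], as in the table
```

The last feature map is 2×2×50, which the Flatten layer turns into the 200-unit vector feeding dense_6.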
Expanding the training set:
RandomRotation(0.5) = [-50% · 2π, 50% · 2π], i.e. random rotations over the full ±π-radian range
Randomly dropping units in the dense layers:
Dropout(0.05) = 5 %
Epochs: 50
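Putting the pieces together, the summarized network can be sketched in Keras as below. This is a reconstruction under stated assumptions: the 3×3 kernels and 2×2 pooling are inferred from the shapes, and the ReLU/softmax activations are conventional choices not given in the source.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(100, 100, 3)),
    layers.RandomRotation(0.5),   # factor 0.5 -> rotations in [-pi, pi]
    layers.Conv2D(32, 3, activation="relu"),   # assumed 3x3 kernels, ReLU
    layers.MaxPooling2D(),                     # 2x2 pooling, stride 2
    layers.Conv2D(50, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(50, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(50, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(50, 3, activation="relu"),
    layers.Flatten(),
    layers.Dense(100, activation="relu"),
    layers.Dropout(0.05),          # drops 5% of units during training
    layers.Dense(50, activation="relu"),
    layers.Dropout(0.05),
    layers.Dense(12, activation="softmax"),    # assumed softmax output
])
print(model.count_params())  # 108758, matching the summary
```

Note that RandomRotation and Dropout are active only during training; at inference time both behave as identity transforms.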